Entropy is the average amount of information contained in each message received. In the case of images, the entropy of an image is the average number of bits per pixel needed to represent it.
$$H = - \sum\limits_{i,j} p(I_{ij}) \log_2 p(I_{ij})$$with the probabilities then obtained through histogram estimation:
$$H = - \sum\limits_{v} h_v \log_2 h_v$$We then examine how the entropy varies as we reduce the number of bits used to represent an image:
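As a quick sanity check of the histogram estimate above (a minimal sketch; the `entropy` helper below is mine, not part of the notebook), an image whose four gray levels are equally likely should have entropy $\log_2 4 = 2$ bits:

```python
import numpy as np

def entropy(f):
    """Entropy in bits per pixel from the image's normalized histogram."""
    counts = np.bincount(f.ravel())
    p = counts[counts > 0] / float(f.size)  # h_v: estimated probabilities
    return -np.sum(p * np.log2(p))

# a 2x2 image with four equally likely values: entropy = log2(4) = 2 bits
img = np.array([[0, 1], [2, 3]], dtype=np.uint8)
print(entropy(img))  # → 2.0
```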
In [6]:
%matplotlib inline
import numpy as np
import matplotlib.pyplot as plt
from scipy import ndimage
import sys
import math
sys.path.append('/home/ciro/Documents/USP-2015-1sem/mac0417-visao')
from src.lib import normalize
def get_entropy(f):
    """Computes the entropy of a given image"""
    bins = np.bincount(f.ravel().astype(np.uint8))
    probs = bins[bins > 0] / float(f.size)
    return -np.sum(probs * np.log2(probs))
img = ndimage.imread('../assets/cameraman.tif', flatten=True)
plt.figure()
plt.title('imagem original')
plt.axis('off')
plt.imshow(img, cmap="gray")
print "entropy: ", get_entropy(img)
To reduce the number of bits used to represent an image we might use IGS (improved gray-scale quantization), which performs much better than simply rounding the values of the image array.
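The IGS procedure itself is not implemented in the notebook's code below, so here is a hedged sketch of the classic algorithm: add the low-order bits of a running sum to each pixel before truncating, which dithers away the false contours that plain truncation produces. The function name `igs_quantize` and the tiny demo image are mine, not from the source.

```python
import numpy as np

def igs_quantize(f, bits=4):
    """Improved gray-scale (IGS) quantization of an 8-bit image.

    For each pixel, the low-order bits of the previous running sum are
    added in before truncation to the target depth; pixels already at
    the top of the range pass through unchanged to avoid overflow.
    """
    drop = 8 - bits                    # low-order bits to discard
    mask = (1 << drop) - 1             # e.g. 0x0F for a 4-bit target
    out = np.empty(f.size, dtype=np.uint8)
    s = 0
    for i, p in enumerate(f.ravel().astype(np.uint16)):
        # saturated pixels (all high-order bits set) are passed through
        s = p if p >> drop == (1 << bits) - 1 else p + (s & mask)
        out[i] = s >> drop             # keep only the high-order bits
    return out.reshape(f.shape)

# demo on a tiny (hypothetical) image
f = np.array([[100, 200], [50, 255]], dtype=np.uint8)
print(igs_quantize(f, bits=4))
```

Note the running sum makes the quantized value of each pixel depend on its predecessors, which is exactly what breaks up banding in smooth gradients.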
In [28]:
def gen_ramp(h, w, orientation='h'):
    """Generates a ramp (linear gradient) along the given orientation"""
    rows, cols = np.meshgrid(np.arange(h), np.arange(w), indexing='ij')
    return normalize(cols if orientation == 'h' else rows)
def reduce_bits_from_img(f, bits=8):
    """Normalizes an image into the range of values
    representable with the given number of bits"""
    return normalize(f, (0, 2 ** bits - 1)).astype(np.uint8)
new_img = reduce_bits_from_img(gen_ramp(20,20), 4)
plt.figure()
plt.title("4 bits")
plt.axis('off')
plt.imshow(new_img, cmap="gray")
print "entropy: ", get_entropy(new_img)